
[SPARK-52519][PS] Enable divide-by-zero for numeric floordiv with ANSI enabled #51209


Closed
xinrong-meng wants to merge 5 commits into master from xinrong-meng:num_floordiv

Conversation

@xinrong-meng (Member) commented Jun 17, 2025

What changes were proposed in this pull request?

Enable divide-by-zero for numeric floordiv with ANSI enabled

Why are the changes needed?

Ensure pandas on Spark works well with ANSI mode on.
Part of https://issues.apache.org/jira/browse/SPARK-52169.

Does this PR introduce any user-facing change?

Yes.

>>> spark.conf.get("spark.sql.ansi.enabled")
'true'
>>> ps.set_option("compute.fail_on_ansi_mode", False)
>>> ps.set_option("compute.ansi_mode_support", True)
>>> ps.Series([1, 2]) // 0
0    inf
1    inf
dtype: float64
>>> ps.Series([1, 2]) // ps.Series([0, 0])
0    inf
1    inf
dtype: float64
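
For comparison, plain pandas returns the same result for this expression, and matching that behavior is the goal here. Illustrative snippet, not part of the PR:

```py
import pandas as pd

# Positive integers floor-divided by zero yield inf in pandas,
# which the pandas-on-Spark output above mirrors.
print(pd.Series([1, 2]) // 0)
# 0    inf
# 1    inf
# dtype: float64
```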

How was this patch tested?

Unit tests.

(dev3.10) spark (num_floordiv) % SPARK_ANSI_SQL_MODE=true  ./python/run-tests --python-executables=python3.10 --testnames "pyspark.pandas.tests.computation.test_binary_ops FrameBinaryOpsTests.test_binary_operator_floordiv"
...
Tests passed in 6 seconds

(dev3.10) spark (num_floordiv) % SPARK_ANSI_SQL_MODE=false  ./python/run-tests --python-executables=python3.10 --testnames "pyspark.pandas.tests.computation.test_binary_ops FrameBinaryOpsTests.test_binary_operator_floordiv"
...
Tests passed in 4 seconds

Was this patch authored or co-authored using generative AI tooling?

No.


 def floordiv(left: PySparkColumn, right: Any) -> PySparkColumn:
     return F.when(F.lit(right is np.nan), np.nan).otherwise(
         F.when(
             F.lit(right != 0) | F.lit(right).isNull(),
             F.floor(left.__div__(right)),
-        ).otherwise(F.lit(np.inf).__div__(left))
+        ).otherwise(F.try_divide(F.lit(np.inf), left))
     )
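
For context: under ANSI mode a plain Spark division by zero raises a DIVIDE_BY_ZERO error, while F.try_divide returns NULL instead, which is why the zero-divisor branch switches to try_divide (the left operand itself may be zero). A minimal standalone sketch, assuming an ANSI-enabled SparkSession bound to `spark`; this snippet is illustrative and not part of the PR:

```py
from pyspark.sql import functions as F

df = spark.range(1)  # `spark` is an assumed, ANSI-enabled SparkSession

# try_divide yields NULL for a zero divisor instead of raising.
df.select(F.try_divide(F.lit(1.0), F.lit(0.0)).alias("q")).show()

# A plain division by zero fails at execution time under ANSI mode
# with a SparkArithmeticException (DIVIDE_BY_ZERO).
df.select((F.lit(1.0) / F.lit(0.0)).alias("q")).show()
```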
Member

Don't we need to check the config here?

Member Author

Forgot we need to keep the current error; updated, thanks!

@ueshin (Member) left a comment

LGTM, pending tests.

Co-authored-by: Takuya UESHIN <[email protected]>
@@ -369,6 +376,9 @@ def floordiv(self, left: IndexOpsLike, right: Any) -> SeriesOrIndex:
         _sanitize_list_like(right)
         if not is_valid_operand_for_numeric_arithmetic(right):
             raise TypeError("Floor division can not be applied to given types.")
+        spark_session = left._internal.spark_frame.sparkSession
+        use_try_divide = is_ansi_mode_enabled(spark_session)
+        safe_div = F.try_divide if use_try_divide else lambda x, y: x.__div__(y)  # type: ignore
Member

ditto.

Member Author

Sounds good. BTW, the code is adjusted to avoid E731.
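
For reference, flake8 rule E731 is "do not assign a lambda expression, use a def". A hypothetical sketch of that kind of adjustment, reusing `use_try_divide` from the diff above; the exact merged code may differ:

```py
from pyspark.sql import Column as PySparkColumn
from pyspark.sql import functions as F

use_try_divide = True  # stand-in; computed via is_ansi_mode_enabled(...) as in the diff above


def safe_div(numerator: PySparkColumn, denominator: PySparkColumn) -> PySparkColumn:
    # Under ANSI mode, try_divide returns NULL on a zero divisor instead of raising.
    if use_try_divide:
        return F.try_divide(numerator, denominator)
    return numerator.__div__(denominator)
```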

@xinrong-meng (Member Author)

Merged to master, thanks!

haoyangeng-db pushed a commit to haoyangeng-db/apache-spark that referenced this pull request Jun 25, 2025
[SPARK-52519][PS] Enable divide-by-zero for numeric floordiv with ANSI enabled

Closes apache#51209 from xinrong-meng/num_floordiv.

Lead-authored-by: Xinrong Meng <[email protected]>
Co-authored-by: Xinrong Meng <[email protected]>
Signed-off-by: Xinrong Meng <[email protected]>